Research Letter
Abstract
This mixed methods pilot study evaluates the feasibility and effectiveness of microlearning for faculty development in cardiovascular education. Microlearning appears feasible and well received, offering a scalable, flexible approach.
JMIR Med Educ 2026;12:e87980. doi: 10.2196/87980
Introduction
Traditional faculty development relies on time-intensive, in-person sessions, limiting participation for clinician-educators balancing clinical and administrative duties [,].
Microlearning delivers brief, focused segments (≤15 min) aligned with learning objectives [,]. Grounded in cognitive science, it reduces cognitive load and enhances retention by presenting information in small units []. For busy clinician-educators, microlearning offers flexible, asynchronous learning that addresses time constraints and supports targeted, on-demand modules for skill application [,,]. It also improves engagement and aligns with digital learning preferences [,].
However, research has focused largely on the short-term outcomes of microlearning, leaving gaps in evidence on sustained knowledge transfer and behavioral change [,,]. Without accessible and time-efficient faculty development approaches, clinician-educators may continue to rely on informal or inconsistent training, potentially compromising the quality of educational assessments, learner outcomes, and the validity of continuing medical education (CME) activities.
We selected CME multiple‑choice question (MCQ) development as our intervention topic based on an internal needs analysis indicating that faculty responsible for creating CME assessment items receive little or no structured guidance in item writing. This task was identified locally as a high‑frequency responsibility in which faculty desired more support. MCQ development serves as a practical context to evaluate microlearning for clinician‑educators.
This mixed methods pilot evaluates microlearning feasibility and effectiveness in cardiovascular faculty development by assessing learning transfer, satisfaction, and 4‑month knowledge application.
Methods
We conducted a sequential explanatory study guided by Kirkpatrick’s framework [] to assess a microlearning module for faculty development in cardiovascular education. The Mayo Clinic Institutional Review Board reviewed the study and determined it to be exempt. Participants provided informed consent and could withdraw at any time. Those completing all components received US $200 remuneration. All data were deidentified. Quantitative analysis was prioritized, with qualitative interviews used to explain and contextualize test results.
Cardiology clinician‑educators responsible for CME MCQs were recruited via email. Eligibility required current responsibility for board‑style MCQ development and no formal training or related coursework within 6 months. Of the 75 identified faculty, 34 met the criteria; 8 enrolled and completed all the components.
The pretest comprised 18 items across 4 sections (400 points total, scored in the learning management system), with each section preceding its corresponding microlearning segment (). The module consisted of 4 short videos, a quick‑reference guide, and an MCQ‑writing template, delivered asynchronously via the cardiology CME learning management system for 3 months. After a 15‑item satisfaction survey (), access was discontinued. An identical posttest occurred 4 months after completion to assess retention; 2 weeks later, semistructured Teams interviews explored application and perceptions ().
We compared pretest and posttest scores using 2‑sided Wilcoxon signed-rank tests, reported both with ties excluded and with the Pratt method including ties, calculated matched-pairs rank-biserial effect sizes, and summarized score distributions with medians and IQRs.
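To make the analysis concrete, the sketch below shows how these statistics can be computed in Python with scipy. The paired scores are hypothetical placeholders chosen only to mirror the reported pattern of 5 increases, 2 ties, and 1 decrease (the study data are not publicly available), and the snippet illustrates the reported methods rather than the authors' actual code.

```python
# Illustrative sketch only: hypothetical paired scores, not the study data.
import numpy as np
from scipy import stats

pre  = np.array([339.0, 355.0, 366.0, 377.0, 350.0, 400.0, 361.0, 320.0])  # hypothetical
post = np.array([362.0, 400.0, 400.0, 400.0, 389.0, 400.0, 361.0, 300.0])  # hypothetical
diff = post - pre

# Two-sided Wilcoxon signed-rank tests: zero differences (ties) excluded by
# default ("wilcox") and retained in the ranking via the Pratt method ("pratt").
res_excl  = stats.wilcoxon(post, pre, zero_method="wilcox", alternative="two-sided")
res_pratt = stats.wilcoxon(post, pre, zero_method="pratt", alternative="two-sided")

# Matched-pairs rank-biserial correlation over the informative (non-tied) pairs:
# (sum of positive ranks - sum of negative ranks) / (total rank sum).
nz    = diff[diff != 0]
ranks = stats.rankdata(np.abs(nz))
t_pos = ranks[nz > 0].sum()
t_neg = ranks[nz < 0].sum()
r_rb  = (t_pos - t_neg) / (t_pos + t_neg)

print(f"excluding ties: W={res_excl.statistic}, P={res_excl.pvalue:.2f}")
print(f"Pratt (ties included): W={res_pratt.statistic}, P={res_pratt.pvalue:.2f}")
print(f"rank-biserial r = {r_rb:.2f}")
print(f"median (IQR) pretest: {np.median(pre):.2f} "
      f"({np.percentile(pre, 25):.2f}-{np.percentile(pre, 75):.2f})")
```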
Survey responses were summarized descriptively and used to inform preliminary qualitative themes. We conducted a structured hybrid deductive–inductive analysis. The lead investigator completed line‑by‑line coding using an a priori codebook derived from the interview guide, microlearning constructs, and Kirkpatrick’s model, with inductive codes added as needed. An education researcher not involved in the study independently reviewed all coded transcripts. Discrepancies were resolved through consensus, and the codebook was refined iteratively.
Quantitative and qualitative data were integrated by the lead and co-lead investigators, who jointly analyzed test scores, survey responses, and interview transcripts. Quantitative results informed qualitative coding, enabling exploration of trends and outliers. Both strands were interpreted together to provide a comprehensive understanding of the module’s impact.
Results
Among the 8 completers, the median pretest score was 366.07 (IQR 338.93-389.28), and the median posttest score was 400.00 (IQR 361.96-400.00). Of the 8 paired scores, 5 improved, 2 were unchanged (ties), and 1 decreased, leaving n=6 informative pairs.
A Wilcoxon signed-rank test excluding ties yielded W=5.0, P=.25; the Pratt method including ties yielded W=7.0, P=.18. The matched-pairs rank-biserial correlation was 0.52, suggesting a moderate positive effect, although neither test reached statistical significance.
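For context, the matched-pairs rank-biserial correlation equals the difference between the positive and negative rank sums divided by their total. Assuming the reported W=5.0 (ties excluded) is the sum of negative ranks, and noting that the 6 informative pairs have a total rank sum of 21, the positive rank sum is 16 and

r = (T+ − T−) / (T+ + T−) = (16 − 5) / 21 ≈ 0.52,

which is consistent with the reported effect size.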
Satisfaction survey responses showed strong agreement across all dimensions, with participants endorsing the module’s relevance, clarity, and flexibility ().
| Survey item | Strongly agree, n (%) | Agree, n (%) |
| Videos were engaging | 6 (75) | 2 (25) |
| Video length was appropriate | 8 (100) | 0 (0) |
| Modular format was effective | 7 (88) | 1 (12) |
| Prepared to write board-style review questions | 6 (75) | 2 (25) |
| Quick reference guides were valuable | 8 (100) | 0 (0) |
| Pretest helped gauge prior knowledge | 5 (63) | 3 (37) |
| Pretest guided learning | 5 (63) | 3 (37) |
| Posttest reinforced learning | 5 (63) | 3 (37) |
Qualitative analysis revealed three main themes: (1) appreciation for the concise, flexible format; (2) direct application of learned principles in educational practice; and (3) high perceived value and satisfaction (). Time constraints remained a barrier to engaging in faculty development, but microlearning was widely endorsed as effective and scalable.
| Theme | Subthemes | Representative findings |
| Value of microlearning format | | Microlearning was praised for fitting into busy schedules and enabling learning in short, focused bursts |
| Knowledge application and transfer | | Learned principles were immediately used in CMEa MCQb development and shared with colleagues. Modules will serve as an ongoing reference |
| Perceived value and satisfaction | | Participants appreciated the clarity, relevance, and production quality of the modules and resources |
| Barriers to faculty development | | Common barriers included competing clinical/administrative demands and lack of dedicated time for learning |
| Suggestions for improvement | | Recommendations included hosting on familiar organizational platforms, adding time estimates, and offering structured follow-up with an expert |
aCME: continuing medical education.
bMCQ: multiple choice question.
Discussion
We found that a microlearning intervention for cardiovascular faculty development was feasible, well received, and associated with numerically higher knowledge scores and perceived application of skills at 4 months, aligning with the objectives to assess learning transfer, satisfaction, and sustained knowledge application.
Findings align with prior evidence that microlearning enhances engagement, skill acquisition, and retention in health education [,]. The module’s brevity, relevance, and asynchronous access likely contributed to its effectiveness. The 4-month follow-up addresses a gap in prior faculty-focused studies, which often lack longer-term data []. Microlearning can improve learning outcomes and self-efficacy, further supporting its value in CME [].
By integrating quantitative data from pretests and posttests and satisfaction surveys with qualitative insights from interviews, investigators contextualized statistical findings with participant perspectives. Quantitative results informed preliminary qualitative coding, enabling exploration of trends and outliers. This approach provided a more comprehensive understanding of the module’s impact and feasibility, revealing factors and barriers not apparent from a single data type.
Mixed methods revealed both measurable gains and contextual insights, including barriers such as time constraints. Participant recommendations such as hosting modules on familiar platforms and providing time estimates for each section may further improve engagement.
Limitations include the single-institution setting, small sample size, exclusion of tied pairs in the primary analysis, lack of a control group, and potential response bias. The 4-month follow-up may not capture durable behavior change. Integrating quantitative and qualitative methods enhanced understanding but may introduce interpretive complexity.
Overall, microlearning appears to be a scalable, flexible approach to faculty development, well-suited to clinical educators. Institutions should consider implementing microlearning modules with structured follow-up to reinforce learning. Future research should use larger, more diverse samples, include control groups, and extend follow-up to evaluate long-term behavioral and organizational outcomes.
Acknowledgments
The authors would like to thank Metta A Kuehntopp, MEd, and Jeffrey C Williams for their part in the creation of the intervention used in this study. We thank Patricia K Guthrie for her valuable contributions to developing content within our learning management system. No generative artificial intelligence was used in the preparation of this manuscript.
Data Availability
The datasets generated and analyzed during this study are not publicly available because participant consent for data sharing was not obtained. Data may be available from the corresponding author upon reasonable request and with appropriate institutional approvals, provided such sharing complies with participant privacy and ethical guidelines.
Funding
This research was funded by the Mayo Clinic College of Medicine and Science Office of Applied Scholarship and Education Science Endowment for Education Research Award. The award provided US $12,000 for a 1-year period starting January 2024.
Authors' Contributions
Conceptualization: DLL (lead), MWC (supporting)
Data curation: DLL
Formal analysis: DLL
Funding acquisition: DLL
Investigation: DLL
Methodology: DLL (lead), MWC (supporting)
Project administration: DLL (lead), JBG (supporting), MWC (supporting)
Resources: DLL (lead), JBG (supporting)
Supervision: MWC
Validation: DLL
Visualization: DLL (lead), MWC (supporting)
Writing – original draft: DLL (lead), MWC (supporting)
Writing – review & editing: DLL (lead), MWC (supporting), JBG (supporting), JAL (supporting)
Conflicts of Interest
None declared.
Microlearning pretest and posttest.
DOCX File, 499 KB

Post course evaluation.
DOCX File, 31 KB

Follow-up interview guide.
DOCX File, 29 KB

References
- Dyrbye L, Bergene A, Leep HA, Billings H. Reimagining faculty development deployment: a multipronged, pragmatic approach to improve engagement. Acad Med. Sep 2022;97(9):1322-1330. [CrossRef]
- Cook DA, Steinert Y. Online learning for faculty development: a review of the literature. Med Teach. Nov 2013;35(11):930-937. [CrossRef] [Medline]
- De Gagne JC, Park HK, Hall K, Woodward A, Yamane S, Kim SS. Microlearning in health professions education: scoping review. JMIR Med Educ. Jul 23, 2019;5(2):e13997. [FREE Full text] [CrossRef] [Medline]
- Bowler C, Foshee C, Haggar F, Simpson D, Schroedl C, Billings H. Got 15? Try faculty development on the fly: a snippets workshop for microlearning. MedEdPORTAL. Jun 14, 2021;17:11161. [FREE Full text] [CrossRef] [Medline]
- Taylor A, Hung W. The effects of microlearning: a scoping review. Education Tech Research Dev. Jan 26, 2022;70(2):363-395. [CrossRef]
- Monib WK, Qazi A, Apong RA. Microlearning beyond boundaries: a systematic review and a novel framework for improving learning outcomes. Heliyon. Jan 30, 2025;11(2):e41413. [FREE Full text] [CrossRef] [Medline]
- Moore R, Hwang W. A systematic review of mobile-based microlearning in adult learner contexts. Educ Technol Soc. 2024:27-146. [CrossRef]
- Kirkpatrick J, Kirkpatrick W. Kirkpatrick's Four Levels of Training Evaluation. Alexandria, VA. Association for Talent Development; 2016.
- Zarshenas L, Mehrabi M, Karamdar L, Keshavarzi MH, Keshtkaran Z. The effect of micro-learning on learning and self-efficacy of nursing students: an interventional study. BMC Med Educ. Sep 07, 2022;22(1):664. [FREE Full text] [CrossRef] [Medline]
Abbreviations
| CME: continuing medical education |
| MCQ: multiple choice question |
Edited by A Stone; submitted 17.Nov.2025; peer-reviewed by BS Chisholm, S Otero; comments to author 15.Dec.2025; revised version received 18.Feb.2026; accepted 19.Feb.2026; published 11.Mar.2026.
Copyright©Darci L Lammers, Jeffrey B Geske, Jane A Linderbaum, Michael W Cullen. Originally published in JMIR Medical Education (https://mededu.jmir.org), 11.Mar.2026.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.

